Implementing Gaussian process inference with neural networks.
Authors
Abstract
Gaussian processes compare favourably with backpropagation neural networks as a tool for regression, and Bayesian neural networks have Gaussian process behaviour when the number of hidden neurons tends to infinity. We describe a simple recurrent neural network with connection weights trained by one-shot Hebbian learning. This network amounts to a dynamical system which relaxes to a stable state in which it generates predictions identical to those of Gaussian process regression. In effect an infinite number of hidden units in a feed-forward architecture can be replaced by a merely finite number, together with recurrent connections.
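The equivalence described in the abstract can be illustrated with a minimal sketch. This is not the paper's specific Hebbian recurrent network; it is an assumed fixed-point iteration showing the core idea that a dynamical system whose stable state solves (K + σ²I)α = y reproduces the predictions of direct Gaussian process regression:

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    # Squared-exponential kernel between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def gp_predict_direct(X, y, Xs, noise=0.2):
    # Standard GP regression mean via a direct linear solve:
    # f_* = K_* (K + noise^2 I)^{-1} y
    K = rbf_kernel(X, X) + noise**2 * np.eye(len(X))
    Ks = rbf_kernel(Xs, X)
    return Ks @ np.linalg.solve(K, y)

def gp_predict_relaxation(X, y, Xs, noise=0.2, lr=0.05, steps=20000):
    # Recurrent-style relaxation: the state alpha evolves under the
    # linear dynamics d(alpha)/dt = y - (K + noise^2 I) alpha, whose
    # unique stable fixed point is the exact GP solution above.
    K = rbf_kernel(X, X) + noise**2 * np.eye(len(X))
    alpha = np.zeros(len(X))
    for _ in range(steps):
        alpha += lr * (y - K @ alpha)
    return rbf_kernel(Xs, X) @ alpha

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(20, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(20)
Xs = np.linspace(-3, 3, 50)[:, None]

direct = gp_predict_direct(X, y, Xs)
relaxed = gp_predict_relaxation(X, y, Xs)
print(np.max(np.abs(direct - relaxed)))  # relaxation matches the direct solve
```

Because K + σ²I is positive definite, the iteration converges for a small enough step size, so only finitely many recurrently connected units (one state component per training point) are needed to recover the GP prediction.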
Related resources
Efficient Implementation of Gaussian Processes
Neural networks and Bayesian inference provide a useful framework within which to solve regression problems. However, their parameterization means that the Bayesian analysis of neural networks can be difficult. In this paper, we investigate a method for regression using Gaussian process priors which allows exact Bayesian analysis using matrix manipulations. We discuss the workings of the method in...
Forecasting Industrial Production in Iran: A Comparative Study of Artificial Neural Networks and Adaptive Neuro-Fuzzy Inference System
Forecasting industrial production is essential for efficient planning by managers. Although there are many statistical and mathematical methods for prediction, the use of intelligent algorithms with desirable features has made significant progress in recent years. The current study compared the accuracy of the Artificial Neural Networks (ANN) and Adaptive Neuro-Fuzzy Inference System (ANFIS) app...
Accelerating Deep Gaussian Processes Inference with Arc-Cosine Kernels
Deep Gaussian Processes (DGPs) are probabilistic deep models obtained by stacking multiple layers implemented through Gaussian Processes (GPs). Although attractive from a theoretical point of view, learning DGPs poses some significant computational challenges that arguably hinder their application to a wider variety of problems for which Deep Neural Networks (DNNs) are the preferred choice. We ...
Bayesian Convolutional Neural Networks with Bernoulli Approximate Variational Inference
We present an efficient Bayesian convolutional neural network (convnet). The model offers better robustness to over-fitting on small data than traditional approaches. This is achieved by placing a probability distribution over the convnet’s kernels (also known as filters). We approximate the model’s intractable posterior with Bernoulli variational distributions. This requires no additional model paramet...
Novel Radial Basis Function Neural Networks based on Probabilistic Evolutionary and Gaussian Mixture Model for Satellites Optimum Selection
In this study, two novel learning algorithms have been applied to Radial Basis Function Neural Networks (RBFNN) to approximate functions of high non-linear order. The Probabilistic Evolutionary (PE) and Gaussian Mixture Model (GMM) techniques are proposed to significantly minimize the error functions. The main idea concerns the various strategies to optimize the procedure of Gradient ...
Journal: International Journal of Neural Systems
Volume: 16, Issue: 5
Pages: -
Published: 2006